Brain network feature identification algorithm for Alzheimer's patients based on MRI images
ZHU Lin, YU Haitao, LEI Xinyu, LIU Jing, WANG Ruofan
Journal of Computer Applications    2020, 40 (8): 2455-2459.   DOI: 10.11772/j.issn.1001-9081.2019122105
Manual identification of Alzheimer's Disease (AD) from brain imaging is subjective and prone to misdiagnosis. To address this, a method for automatic AD identification was proposed that constructs a brain network from Magnetic Resonance Imaging (MRI) images. Firstly, the MRI images were superimposed and divided into structural blocks, and the Structural SIMilarity (SSIM) between every pair of blocks was calculated to construct the network. Then, complex network theory was used to extract structural parameters, which served as the input of a machine learning algorithm for automatic AD identification. Analysis showed that classification performance was best with two parameters as input, in particular node betweenness and edge betweenness. Further study found that classification was optimal when each MRI image was divided into 27 structural blocks, with accuracies of up to 91.04% for the weighted network and 94.51% for the unweighted network. The experimental results show that a structural-similarity complex network based on MRI block division can identify AD with high accuracy.
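As a rough sketch of the pipeline described above (the 3x3x3 split, the single-window SSIM formula, the zero edge threshold, and the random volume are illustrative assumptions, not the paper's settings):

```python
import numpy as np
import networkx as nx

def ssim(a, b, c1=1e-4, c2=9e-4):
    # Simplified single-window SSIM between two equal-sized blocks.
    ma, mb = a.mean(), b.mean()
    cov = ((a - ma) * (b - mb)).mean()
    return ((2 * ma * mb + c1) * (2 * cov + c2)) / \
           ((ma ** 2 + mb ** 2 + c1) * (a.var() + b.var() + c2))

def block_network(volume, n=3, thresh=0.0):
    # Split the (superimposed) volume into n*n*n structural blocks and
    # connect every pair of blocks whose SSIM exceeds the threshold.
    blocks = [b for s0 in np.array_split(volume, n, axis=0)
                for s1 in np.array_split(s0, n, axis=1)
                for b in np.array_split(s1, n, axis=2)]
    g = nx.Graph()
    g.add_nodes_from(range(len(blocks)))
    for i in range(len(blocks)):
        for j in range(i + 1, len(blocks)):
            if ssim(blocks[i], blocks[j]) > thresh:
                g.add_edge(i, j)
    return g

rng = np.random.default_rng(0)
net = block_network(rng.random((12, 12, 12)))   # 27 blocks of 4x4x4 voxels
features = nx.betweenness_centrality(net)       # node betweenness per block
```

The betweenness values (and analogous edge-betweenness values from `nx.edge_betweenness_centrality`) would then feed the downstream classifier.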
Remaining useful life prediction for turbofan engines by genetic algorithm-based selective ensembling and temporal convolutional network
ZHU Lin, NING Qian, LEI Yinjie, CHEN Bingcai
Journal of Computer Applications    2020, 40 (12): 3534-3540.   DOI: 10.11772/j.issn.1001-9081.2020050661
As one of the core pieces of equipment in aerospace, the turbofan engine has a health condition that determines whether an aircraft can operate stably and reliably, and predicting its Remaining Useful Life (RUL) is an important part of equipment monitoring and maintenance. In view of the complicated operating conditions, diverse monitoring data, and long time spans involved in turbofan engine monitoring, an RUL prediction model integrating Genetic Algorithm-based Selective ENsembling (GASEN) and Temporal Convolutional Network (TCN), named GASEN-TCN, was proposed. Firstly, TCN was used to capture long-span dependencies in the data and predict the RUL. Then, GASEN was applied to ensemble multiple independently trained TCNs to enhance the generalization of the model. Finally, the proposed model was compared with popular machine learning methods and other deep neural networks on the public Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) dataset. Experimental results show that the proposed model achieves higher prediction accuracy and lower prediction error than the state-of-the-art Bidirectional Long Short-Term Memory (Bi-LSTM) network under many different operating modes and fault conditions. Taking the FD001 dataset as an example, the Root Mean Square Error (RMSE) of the proposed model is 17.08% lower than that of Bi-LSTM, and its relative accuracy is 12.16% higher. The proposed model thus has considerable application prospects in the intelligent overhaul and maintenance of equipment.
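The GASEN step can be sketched independently of the TCN learners; here eight synthetic prediction vectors stand in for trained TCNs, and the particular GA operators (averaging crossover, Gaussian mutation, above-average-weight selection) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.random(200)                        # synthetic "true RUL" values
preds = y + rng.normal(0, 0.3, (8, 200))   # each row: one learner's predictions

def fitness(w):
    # Negative RMSE of the weighted ensemble (higher is better).
    w = np.abs(w) / np.abs(w).sum()
    return -np.sqrt(((w @ preds - y) ** 2).mean())

def gasen(pop=30, gens=40):
    n = preds.shape[0]
    population = rng.random((pop, n))
    for _ in range(gens):
        scores = np.array([fitness(w) for w in population])
        elite = population[np.argsort(scores)[-pop // 2:]]        # best half
        pairs = rng.integers(0, len(elite), (pop - len(elite), 2))
        children = (elite[pairs[:, 0]] + elite[pairs[:, 1]]) / 2  # crossover
        children += rng.normal(0, 0.05, children.shape)           # mutation
        population = np.vstack([elite, children])
    best = max(population, key=fitness)
    w = np.abs(best) / np.abs(best).sum()
    selected = np.flatnonzero(w > 1.0 / n)  # learners with above-average weight
    return selected, preds[selected].mean(axis=0)

selected, ensemble = gasen()
```

The key GASEN idea is visible in the last lines: the evolved weights are used only to *select* a subset of learners, whose plain average forms the final ensemble.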
Nuclear magnetic resonance logging reservoir permeability prediction method based on deep belief network and kernel extreme learning machine algorithm
ZHU Linqi, ZHANG Chong, ZHOU Xueqing, WEI Yang, HUANG Yuyang, GAO Qiming
Journal of Computer Applications    2017, 37 (10): 3034-3038.   DOI: 10.11772/j.issn.1001-9081.2017.10.3034
Due to the complicated pore structure of low-porosity, low-permeability reservoirs, existing Nuclear Magnetic Resonance (NMR) logging permeability models predict such reservoirs with low accuracy. To solve this problem, a permeability prediction method based on the Deep Belief Network (DBN) algorithm and the Kernel Extreme Learning Machine (KELM) algorithm was proposed. The DBN model was first pre-trained, then the KELM model was placed as a predictor on top of the trained DBN, and finally the combined Deep Belief Kernel Extreme Learning Machine Network (DBKELMN) model was trained with supervision on the training data. Since the model should make full use of the transverse relaxation time spectrum, which reflects pore structure, the discretized transverse relaxation time spectrum of NMR logging was taken as the input and permeability as the output; the functional relationship between the two was determined, and reservoir permeability was predicted from it. Field examples show that the method is effective: the Mean Absolute Error (MAE) on the prediction samples is 0.34 lower than that of the Schlumberger-Doll Research (SDR) model. The experimental results show that combining the DBN and KELM algorithms improves prediction accuracy for low-porosity, low-permeability reservoirs and can be applied to oil and gas field exploration and development.
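A minimal KELM regressor can be written in closed form. The RBF kernel, the regularization constant, and the synthetic stand-in for T2-spectrum features below are illustrative assumptions, and the DBN feature extractor is omitted:

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets.
    d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d)

class KELM:
    # Kernel Extreme Learning Machine: output weights solved in closed form,
    # beta = (K + I/C)^-1 * t, instead of iterative training.
    def fit(self, x, t, c=100.0):
        self.x = x
        k = rbf(x, x)
        self.beta = np.linalg.solve(k + np.eye(len(x)) / c, t)
        return self

    def predict(self, x_new):
        return rbf(x_new, self.x) @ self.beta

rng = np.random.default_rng(2)
x = rng.random((60, 5))   # stand-in for discretized T2-spectrum amplitudes
t = x.sum(axis=1)         # smooth synthetic "permeability" target
model = KELM().fit(x, t)
pred = model.predict(x[:5])
```

In the paper's architecture, `x` would be the features produced by the pre-trained DBN rather than raw inputs.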
Load balancing cloud storage algorithm based on Kademlia
ZHENG Kai, ZHU Lin, CHEN Youguang
Journal of Computer Applications    2015, 35 (3): 643-647.   DOI: 10.11772/j.issn.1001-9081.2015.03.643

Prevailing cloud storage systems normally use a master/slave structure, which may cause performance bottlenecks and scalability problems in extreme cases, so fully distributed cloud storage systems based on Distributed Hash Table (DHT) technology are becoming a new choice. Solving the node load-balancing problem is key to making this technology applicable. The Kademlia algorithm was used to locate storage targets in a cloud storage system and its load-balancing performance was investigated. Since the algorithm's load-balancing performance degrades significantly in a heterogeneous environment, an improved algorithm was proposed that accounts for heterogeneous nodes and their storage capacities and distributes load according to the capacity of each node. The simulation results show that the proposed algorithm effectively improves the load balance of the system: compared with the original algorithm, after a long running period (more than 1500 hours in simulation), the number of overloaded nodes dropped by an average of 7.0% (light load) to 33.7% (heavy load), the file-saving success rate increased by an average of 27.2% (light load) to 35.1% (heavy load), and the communication overhead remains acceptable.
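The capacity-aware placement idea can be sketched as follows; the 32-bit IDs, the choice of k = 3, and the free-capacity weighting are illustrative assumptions rather than the paper's exact scheme:

```python
import hashlib
import random

def node_id(name, bits=32):
    # Hash a name into a (truncated) Kademlia ID space.
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big") % (1 << bits)

class Node:
    def __init__(self, name, capacity):
        self.id, self.capacity, self.used = node_id(name), capacity, 0

def place(key, nodes, k=3):
    # Kademlia locates the k nodes whose IDs are XOR-closest to the key;
    # the capacity-aware improvement then picks among them with probability
    # proportional to each node's remaining storage capacity.
    closest = sorted(nodes, key=lambda n: n.id ^ key)[:k]
    free = [max(n.capacity - n.used, 0) for n in closest]
    if sum(free) == 0:
        return None               # all nearby candidates are full
    target = random.choices(closest, weights=free)[0]
    target.used += 1
    return target

random.seed(0)
nodes = [Node(f"node-{i}", capacity=random.choice([5, 20, 80])) for i in range(20)]
for i in range(200):
    place(node_id(f"file-{i}"), nodes)
```

Weighting by free capacity, rather than placing on the single closest node, is what keeps small nodes from overloading in a heterogeneous deployment.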

K-anonymity privacy-preserving for trajectory in uncertain environment
ZHU Lin, HUANG Shengbo
Journal of Computer Applications    2015, 35 (12): 3437-3441.   DOI: 10.11772/j.issn.1001-9081.2015.12.3437
To comprehensively consider the factors influencing moving objects in an uncertain environment, a k-anonymity privacy-preserving method for trajectories recorded by automatic identification systems was presented. Firstly, an uncertain spatial index model was established and stored in a grid quadtree. Then a continuous k-Nearest Neighbor (KNN) query was used to find trajectories whose areas are similar to the current trajectory, and these were added to the anonymity candidate set. Considering the influence of network scale on the effectiveness of the anonymized information and the probability of an attacker compromising a trajectory, a heuristic algorithm was used to generate the optimal trajectory chain and strengthen trajectory privacy preservation. The experimental results show that, compared with the traditional method, the proposed method decreases information loss by 20% to 50%, keeps information distortion below 50% as the query range grows, and cuts cost loss by 10% to 30%. The proposed method can effectively prevent malicious attackers from accessing trajectory information and can be applied to official vessels for law enforcement at sea.
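A much-simplified sketch of the anonymization step (synchronized trajectories, plain Euclidean similarity instead of the uncertain grid-quadtree index, and bounding-box generalization are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
trajs = rng.random((10, 6, 2)) * 100   # 10 trajectories, 6 time steps, (x, y)

def k_anonymize(trajs, query_idx, k=4):
    # Continuous-kNN stand-in: the k trajectories closest on average to the
    # query trajectory form the anonymity set (the query itself included).
    d = np.linalg.norm(trajs - trajs[query_idx], axis=2).mean(axis=1)
    group = np.argsort(d)[:k]
    # Generalize: publish, per time step, the bounding box of the group, so
    # any of the k trajectories is equally consistent with the release.
    boxes = np.stack([trajs[group].min(axis=0),
                      trajs[group].max(axis=0)], axis=-1)
    return group, boxes

group, boxes = k_anonymize(trajs, query_idx=0)
```

Information loss in such schemes is typically measured by the area of these generalized boxes, which is why candidate-set selection matters.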
Fast image stitching algorithm based on improved speeded up robust feature
ZHU Lin, WANG Ying, LIU Shuyun, ZHAO Bo
Journal of Computer Applications    2014, 34 (10): 2944-2947.   DOI: 10.11772/j.issn.1001-9081.2014.10.2944

A fast image stitching algorithm based on improved Speeded Up Robust Feature (SURF) was proposed to overcome the poor real-time performance and robustness of the original SURF-based stitching algorithms. A machine learning method was adopted to build a binary classifier that identifies the critical feature points obtained by SURF and removes the non-critical ones. In addition, the Relief-F algorithm was used to reduce the dimension of the improved SURF descriptor and accomplish image registration, and a weighted threshold fusion algorithm was adopted to achieve seamless stitching. Experiments verify the real-time performance and robustness of the improved algorithm, which improves both the efficiency of image registration and the speed of image stitching.
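The weighted fusion step can be illustrated on its own; the fixed horizontal overlap and the linear weight ramp below are illustrative assumptions:

```python
import numpy as np

def fuse(left, right, overlap):
    # Blend two registered grayscale images whose last/first `overlap`
    # columns cover the same scene region.
    h, wl = left.shape
    wr = right.shape[1]
    out = np.zeros((h, wl + wr - overlap))
    out[:, :wl - overlap] = left[:, :wl - overlap]
    out[:, wl:] = right[:, overlap:]
    # Weight rises linearly across the seam, so each fused pixel is
    # (1 - w) * left + w * right and the transition is gradual.
    w = np.linspace(0, 1, overlap)
    out[:, wl - overlap:wl] = (1 - w) * left[:, wl - overlap:] + w * right[:, :overlap]
    return out

a = np.full((4, 6), 10.0)
b = np.full((4, 6), 20.0)
stitched = fuse(a, b, overlap=3)
```

The "threshold" variant in such schemes additionally keeps a pixel from one image outright when the two differ by more than a threshold, to suppress ghosting from moving objects.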

Routing protocol in multi-channel wireless mesh networks
Yi PENG, Lei ZHU, Ling LIU
Journal of Computer Applications    2011, 31 (07): 1928-1930.   DOI: 10.3724/SP.J.1087.2011.01928
In order to solve the problem that channel resources cannot be fully utilized by single-path routing protocols in multi-channel wireless mesh networks, a Parallel Multi-path Routing Protocol (PMRP) based on congestion control was proposed. The protocol spreads a data flow over multiple paths and re-discovers routes only after all existing routes have broken, while a congestion-control mechanism prevents congested nodes from carrying new data flows. The simulation results demonstrate that, compared with the Ad hoc On-demand Distance Vector (AODV) routing protocol, PMRP reduces the average end-to-end delay and effectively improves the packet delivery ratio and network throughput.
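A toy sketch of the parallel forwarding idea (the static path list and the fixed per-node queue limit stand in for the protocol's actual route discovery and congestion signalling):

```python
QUEUE_LIMIT = 5  # per-node congestion threshold (illustrative)

def send_flow(paths, queues, packets):
    # Spread packets round-robin over every path whose nodes are all
    # below the congestion threshold; stop (and, in the real protocol,
    # trigger route re-discovery) only when no path remains usable.
    delivered = 0
    for i in range(packets):
        usable = [p for p in paths
                  if all(queues.get(n, 0) < QUEUE_LIMIT for n in p)]
        if not usable:
            break
        path = usable[i % len(usable)]
        for n in path:
            queues[n] = queues.get(n, 0) + 1  # packet occupies each hop's queue
        delivered += 1
    return delivered

# Two paths sharing their endpoints "a" and "d".
paths = [["a", "b", "d"], ["a", "c", "d"]]
delivered = send_flow(paths, {}, packets=20)
```

Here the shared endpoints congest first, illustrating why multipath gains depend on path disjointness as well as on the congestion-control rule.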
Optimization of H.264 encoder based on SIMD technology
ZHU Lin, FENG Yan
Journal of Computer Applications    2005, 25 (12): 2798-2799.  
The SIMD (Single-Instruction Multiple-Data) instruction set was introduced, and the integer DCT (Discrete Cosine Transform), quantization, interpolation, and motion estimation modules of H.264 were optimized with SIMD technology. The experiment indicates that the encoding speed after optimization reaches about 30 fps, an improvement of about 68 times.
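The kernels being optimized are small, regular matrix operations well suited to SIMD. For instance, the H.264 4x4 forward integer transform core (shown here in numpy for clarity; a real encoder replaces such loops with SSE/MMX intrinsics and folds the per-coefficient scaling into quantization):

```python
import numpy as np

# Forward 4x4 integer transform core matrix from the H.264 standard.
CF = np.array([[1,  1,  1,  1],
               [2,  1, -1, -2],
               [1, -1, -1,  1],
               [1, -2,  2, -1]])

def forward_transform_4x4(block):
    # Y = Cf * X * Cf^T : pure integer adds, subtracts, and shifts,
    # which is why it vectorizes so well.
    return CF @ block @ CF.T

blk = np.arange(16).reshape(4, 4)   # illustrative residual block
coeffs = forward_transform_4x4(blk)
```

The DC coefficient `coeffs[0, 0]` is simply the sum of the block, since the first row of `CF` is all ones.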